On the Adaptive Elastic-net with a Diverging Number of Parameters.

Authors

  • Hui Zou
  • Hao Helen Zhang
Abstract

We consider the problem of model selection and estimation in situations where the number of parameters diverges with the sample size. When the dimension is high, an ideal method should have the oracle property (Fan and Li, 2001; Fan and Peng, 2004) which ensures the optimal large sample performance. Furthermore, the high-dimensionality often induces the collinearity problem which should be properly handled by the ideal method. Many existing variable selection methods fail to achieve both goals simultaneously. In this paper, we propose the adaptive Elastic-Net that combines the strengths of the quadratic regularization and the adaptively weighted lasso shrinkage. Under weak regularity conditions, we establish the oracle property of the adaptive Elastic-Net. We show by simulations that the adaptive Elastic-Net deals with the collinearity problem better than the other oracle-like methods, thus enjoying much improved finite sample performance.
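The penalty the abstract describes, a quadratic (ridge) term plus an adaptively weighted lasso term, can be illustrated with a short NumPy sketch. This is not the authors' implementation: the coordinate-descent solver, the tuning constants, and the small 1/n offset added to the weights (to avoid division by zero when an initial coefficient is exactly zero) are assumptions of this sketch; the final (1 + λ2/n) rescaling follows the form of the estimator in the paper.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator S(z, t) = sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def weighted_enet(X, y, lam1, lam2, w, n_iter=200):
    """Coordinate descent for
    (1/2)||y - X b||^2 + (lam2/2)||b||^2 + lam1 * sum_j w_j |b_j|."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    r = y.copy()                      # current residual y - X b
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]       # partial residual excluding coordinate j
            z = X[:, j] @ r
            b[j] = soft_threshold(z, lam1 * w[j]) / (col_sq[j] + lam2)
            r -= X[:, j] * b[j]
    return b

def adaptive_enet(X, y, lam1, lam2, gamma=1.0):
    """Two-stage adaptive Elastic-Net sketch (illustrative, not the paper's code)."""
    n, p = X.shape
    # Stage 1: ordinary elastic net (unit weights) as the initial estimator.
    b_init = weighted_enet(X, y, lam1, lam2, np.ones(p))
    # Stage 2: adaptive weights from the initial fit; the 1/n offset is an
    # assumption of this sketch, guarding against division by zero.
    w = (np.abs(b_init) + 1.0 / n) ** (-gamma)
    b = weighted_enet(X, y, lam1, lam2, w)
    # Rescale by (1 + lam2/n) to undo the extra ridge shrinkage.
    return (1.0 + lam2 / n) * b
```

Large initial coefficients receive small weights and are barely shrunk, while near-zero initial coefficients receive large weights and are thresholded away, which is the mechanism behind the oracle property; in practice λ1, λ2 and γ would typically be chosen by cross-validation.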


Similar articles

Bayesian Quantile Regression with Adaptive Elastic Net Penalty for Longitudinal Data

Longitudinal studies are an important component of epidemiological surveys, clinical trials and social studies. In longitudinal studies, the responses are measured repeatedly over time. Often, the main goal is to characterize the change in responses over time and the factors that influence that change. Recently, to analyze this kind of data, quantile regression has been taken ...


General Estimating Equations: Model Selection and Estimation with Diverging Number of Parameters

This paper develops an adaptive elastic net estimator for general estimating equations. We allow the number of parameters to diverge to infinity. The estimator can also handle collinearity among a large number of variables. This method has the oracle property, meaning we can estimate the nonzero parameters with their standard limit while the redundant parameters are dropped from the equations simulta...


ON THE ADAPTIVE ELASTIC-NET WITH A DIVERGING NUMBER OF PARAMETERS

We consider the problem of model selection and estimation in situations where the number of parameters diverges with the sample size. When the dimension is high, an ideal method should have the oracle property (Fan and Li, 2001; Fan and Peng, 2004) which ensures the optimal large sample performance. Furthermore, the high-dimensionality often induces the collinearity problem which should be prope...


The adaptive Gril estimator with a diverging number of parameters

We consider the problem of variable selection and estimation in the linear regression model in situations where the number of parameters diverges with the sample size. We propose the adaptive Generalized Ridge-Lasso (AdaGril), an extension of the adaptive Elastic Net. AdaGril incorporates information redundancy among correlated variables for model selection and estimation. It combines ...


Investigation of flow and heat transfer of nanofluid in a diverging sinusoidal channel

Nanofluids and ducts with corrugated walls are both expected to enhance heat transfer, by increasing the thermal conductivity of the working fluid and the heat transfer area, respectively. Use of a diverging duct with a jet at the inlet section may further increase heat transfer by creating recirculation zones inside the duct. In this work, two-dimensional incompressible laminar flow of a nanofluid e...



Journal:
  • Annals of Statistics

Volume 37, Issue 4

Pages: -

Published: 2009